    Evaluation of Wind Turbine Converter Designs Considering Thermal Behavior

    Get PDF
    This work evaluates the design of a wind turbine power electronics converter considering its thermal behavior in service. When power electronics converters for wind turbines are designed, very often only the maximum junction temperature of the semiconductors is considered. In a wind turbine application, however, thermal cycling caused by power fluctuations can also reduce the lifetime of the power semiconductors. Over-design is therefore necessary to extend their service life, and to determine the required margin, a lifetime estimate of the power semiconductor devices must be performed.
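
    A lifetime estimate of this kind typically combines cycle counting on the junction temperature profile (e.g. rainflow counting) with an empirical lifetime model and linear damage accumulation. The Python sketch below illustrates the general approach under a Coffin-Manson/Arrhenius model; the model constants and the cycle histogram are illustrative placeholders, not values from the paper.

        import math

        BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

        def cycles_to_failure(delta_tj, tj_mean_k, a=1.0e13, alpha=5.0, ea=0.1):
            # Coffin-Manson/Arrhenius estimate of cycles to failure for one
            # thermal-cycle class; all model constants here are illustrative.
            return a * delta_tj ** (-alpha) * math.exp(ea / (BOLTZMANN_EV * tj_mean_k))

        def accumulated_damage(cycle_histogram):
            # Miner's rule: sum n_i / Nf_i over all cycle classes; failure is
            # predicted when the accumulated damage reaches 1.
            return sum(n / cycles_to_failure(dt, tm) for dt, tm, n in cycle_histogram)

        # Hypothetical yearly histogram from rainflow counting of the junction
        # temperature profile: (delta_Tj in K, mean Tj in K, cycles per year).
        yearly = [(20.0, 340.0, 2.0e6), (40.0, 350.0, 5.0e4), (60.0, 360.0, 1.0e3)]
        print(f"Estimated lifetime: {1.0 / accumulated_damage(yearly):.1f} years")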

    Going Incognito in the Metaverse

    Full text link
    Virtual reality (VR) telepresence applications and the so-called "metaverse" promise to be the next major medium of interaction with the internet. However, with numerous recent studies showing the ease with which VR users can be profiled, deanonymized, and subjected to data harvesting, metaverse platforms carry all the privacy risks of the current internet and more, while at present offering none of the defensive privacy tools we are accustomed to using on the web. To remedy this, we present the first known method of implementing an "incognito mode" for VR. Our technique leverages local differential privacy to quantifiably obscure sensitive user data attributes, with a focus on intelligently adding noise when and where it is needed most to maximize privacy while minimizing usability impact. Moreover, our system is capable of flexibly adapting to the unique needs of each metaverse application to further optimize this trade-off. We implement our solution as a universal Unity (C#) plugin that we then evaluate using several popular VR applications. Upon faithfully replicating the most well-known VR privacy attack studies, we show a significant degradation of attacker capabilities when using our proposed solution.
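
    The core primitive such an incognito mode relies on can be illustrated independently of the authors' Unity (C#) plugin. Below is a minimal Python sketch of local differential privacy via the Laplace mechanism on one bounded attribute (user height); the range bounds and epsilon are assumed for illustration, not taken from the paper.

        import math, random

        def laplace_noise(scale):
            # Inverse-CDF sampling of Laplace(0, scale) noise.
            u = random.random() - 0.5
            sign = 1.0 if u >= 0 else -1.0
            return -scale * sign * math.log(1.0 - 2.0 * abs(u))

        def private_height(true_height, epsilon, lo=1.4, hi=2.0):
            # Clamp to a public range [lo, hi] so the sensitivity is hi - lo,
            # then add Laplace noise before the value leaves the device.
            clamped = min(max(true_height, lo), hi)
            noisy = clamped + laplace_noise((hi - lo) / epsilon)
            # Clamping the release back into range is post-processing,
            # so the epsilon-LDP guarantee is preserved.
            return min(max(noisy, lo), hi)

        print(private_height(1.83, epsilon=1.0))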

    Exploring the Unprecedented Privacy Risks of the Metaverse

    Full text link
    Thirty study participants playtested an innocent-looking "escape room" game in virtual reality (VR). Behind the scenes, an adversarial program had accurately inferred over 25 personal data attributes, from anthropometrics like height and wingspan to demographics like age and gender, within just a few minutes of gameplay. As notoriously data-hungry companies become increasingly involved in VR development, this experimental scenario may soon represent a typical VR user experience. While virtual telepresence applications (and the so-called "metaverse") have recently received increased attention and investment from major tech firms, these environments remain relatively under-studied from a security and privacy standpoint. In this work, we illustrate how VR attackers can covertly ascertain dozens of personal data attributes from seemingly anonymous users of popular metaverse applications like VRChat. These attackers can be ordinary VR users without special privileges, and the potential scale and scope of this data collection far exceed what is feasible within traditional mobile and web applications. We aim to shed light on the unique privacy risks of the metaverse and provide the first holistic framework for understanding intrusive data-harvesting attacks in these emerging VR ecosystems.
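
    To make the attack class concrete, here is a minimal Python sketch of how anthropometrics can fall out of telemetry that social VR platforms routinely expose (headset and controller positions). The quantile and head-offset constants are rough illustrative assumptions, not values from the study.

        def estimate_height(headset_y_samples, eye_to_crown=0.12):
            # Standing eye height is roughly a high quantile of the headset's
            # vertical position; add a rough eye-to-top-of-head offset.
            ordered = sorted(headset_y_samples)
            eye_height = ordered[int(0.95 * (len(ordered) - 1))]
            return eye_height + eye_to_crown

        def estimate_wingspan(left_hand_positions, right_hand_positions):
            # Wingspan is roughly the maximum observed distance between the
            # two hand controllers (reached when the arms are stretched out).
            return max(
                sum((l - r) ** 2 for l, r in zip(lp, rp)) ** 0.5
                for lp, rp in zip(left_hand_positions, right_hand_positions)
            )

        # Hypothetical telemetry: headset heights (m) and paired controller
        # positions (x, y, z) sampled during gameplay.
        print(estimate_height([1.65, 1.70, 1.71, 1.72, 1.71, 1.44]))
        print(estimate_wingspan([(-0.9, 1.4, 0.0)], [(0.9, 1.4, 0.1)]))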

    Lessons Learned: Surveying the Practicality of Differential Privacy in the Industry

    Full text link
    Since its introduction in 2006, differential privacy has emerged as a predominant statistical tool for quantifying data privacy in academic work. Yet despite the plethora of research and open-source utilities that have accompanied its rise, differential privacy has, with limited exceptions, failed to achieve widespread adoption in the enterprise domain. Our study aims to shed light on the fundamental causes underlying this academic-industrial utilization gap through detailed interviews of 24 privacy practitioners across 9 major companies. We analyze the results of our survey to provide key findings and suggestions for companies striving to improve privacy protection in their data workflows, and we highlight the necessary but missing requirements of existing differential privacy tools, with the goal of guiding researchers working towards the broader adoption of differential privacy. Our findings indicate that analysts suffer through lengthy bureaucratic processes to request access to sensitive data, yet once access is granted, only scarcely enforced privacy policies stand between rogue practitioners and the misuse of private information. We thus argue that differential privacy can significantly improve the processes of requesting and conducting data exploration across silos, and conclude that, with a few of the improvements suggested herein, practical use of differential privacy across the enterprise is within striking distance.

    Towards verifiable differentially-private polling

    Get PDF
    Analyses that fulfill differential privacy provide plausible deniability to individuals while allowing analysts to extract insights from data. However, beyond an often acceptable accuracy tradeoff, these statistical disclosure techniques generally inhibit the verifiability of the provided information, as one cannot check the correctness of the participants' truthful information, the differentially private mechanism, or the unbiased random number generation. While related work has already discussed this opportunity, an efficient implementation with a precise bound on errors and corresponding proofs of the differential privacy property is so far missing. In this paper, we follow an approach based on zero-knowledge proofs (ZKPs), specifically succinct non-interactive arguments of knowledge, as a verifiable computation technique to prove the correctness of a differentially private query output. In particular, we ensure that the guarantees of differential privacy hold despite the limitations of ZKPs, which operate on finite fields and have limited branching capabilities. We demonstrate that our approach has practical performance and discuss how practitioners could employ our primitives to verifiably query individuals' age from their digitally signed ID card in a differentially private manner.
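
    The client-side mechanism that such a protocol proves correct can be as simple as binary randomized response over a predicate on the signed attribute (e.g. "age >= 18"). The Python sketch below shows that mechanism and its debiasing step only; the ZKP layer (a SNARK circuit over a finite field) is omitted here.

        import math, random

        def randomized_response(truth, epsilon):
            # Answer truthfully with probability e^eps / (1 + e^eps); this
            # satisfies epsilon-differential privacy for one binary attribute.
            p_truth = math.exp(epsilon) / (1.0 + math.exp(epsilon))
            return truth if random.random() < p_truth else not truth

        def debias(responses, epsilon):
            # Unbiased estimate of the true fraction of "yes" answers:
            # observed = f * (2p - 1) + (1 - p), solved for f.
            p = math.exp(epsilon) / (1.0 + math.exp(epsilon))
            observed = sum(responses) / len(responses)
            return (observed + p - 1.0) / (2.0 * p - 1.0)

        answers = [randomized_response(age >= 18, epsilon=1.0)
                   for age in [17, 25, 34, 16, 42, 19, 70, 15]]
        print(debias(answers, epsilon=1.0))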

    Revealing the Landscape of Privacy-Enhancing Technologies in the Context of Data Markets for the IoT: A Systematic Literature Review

    Get PDF
    IoT data markets in public and private institutions have become increasingly relevant in recent years because of their potential to improve data availability and unlock new business models. However, exchanging data in markets bears considerable challenges related to disclosing sensitive information. Despite considerable research focused on different aspects of privacy-enhancing data markets for the IoT, none of the solutions proposed so far seems to have found practical adoption. Thus, this study aims to organize the state-of-the-art solutions, analyze and scope the technologies that have been suggested in this context, and structure the remaining challenges to determine areas where future research is required. To accomplish this goal, we conducted a systematic literature review on privacy enhancement in data markets for the IoT, covering 50 publications dated up to July 2020, with updates covering 24 further publications dated up to May 2022. Our results indicate that most research in this area has emerged only recently and that no IoT data market architecture has established itself as canonical. Existing solutions frequently lack the required combination of anonymization and secure computation technologies. Furthermore, there is no consensus on the appropriate use of blockchain technology for IoT data markets, and existing libraries and generic data market architectures are rarely reused. We also identified significant remaining challenges, such as the copy problem and the recursive enforcement problem, which, while solutions have been suggested to some extent, are often not sufficiently addressed in proposed designs. We conclude that privacy-enhancing technologies need further improvements to positively impact data markets so that, ultimately, the value of data is preserved through data scarcity and users' privacy and business-critical information are protected.

    Exponential Randomized Response: Boosting Utility in Differentially Private Selection

    Full text link
    A differentially private selection algorithm outputs, from a finite set, the item that approximately maximizes a data-dependent quality function. The most widely adopted mechanisms for this task are the pioneering exponential mechanism and permute-and-flip, which can offer utility improvements of up to a factor of two over the exponential mechanism. This work introduces a new differentially private mechanism for private selection and conducts theoretical and empirical comparisons with the above mechanisms. For reasonably common scenarios, our mechanism can provide utility improvements by factors significantly larger than two over the exponential and permute-and-flip mechanisms. Because the utility can deteriorate in niche scenarios, we recommend our mechanism to analysts who can tolerate lower utility for some datasets. Comment: this algorithm only works under an assumption that is not realistic for the wider application of differential privacy.
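
    For reference, here is a minimal Python sketch of the two baselines the abstract compares against, the exponential mechanism and permute-and-flip; the paper's own mechanism is not specified in the abstract and is not reproduced here. quality(c) scores each candidate and sens is its sensitivity.

        import math, random

        def exponential_mechanism(candidates, quality, epsilon, sens=1.0):
            # Sample candidate c with probability proportional to
            # exp(epsilon * quality(c) / (2 * sens)).
            qs = [quality(c) for c in candidates]
            top = max(qs)  # shift for numerical stability; distribution unchanged
            weights = [math.exp(epsilon * (q - top) / (2.0 * sens)) for q in qs]
            return random.choices(candidates, weights=weights, k=1)[0]

        def permute_and_flip(candidates, quality, epsilon, sens=1.0):
            # Visit candidates in random order; accept candidate c with
            # probability exp(epsilon * (quality(c) - best) / (2 * sens)).
            # The argmax accepts with probability 1, so this always returns.
            best = max(quality(c) for c in candidates)
            order = list(candidates)
            random.shuffle(order)
            for c in order:
                if random.random() < math.exp(epsilon * (quality(c) - best) / (2.0 * sens)):
                    return c

        scores = {"a": 10.0, "b": 9.0, "c": 1.0}
        print(permute_and_flip(list(scores), scores.get, epsilon=1.0))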

    Towards a Privacy-Enhancing Tool Based on De-Identification Methods

    No full text
    Advancements in information and communication technologies (ICTs), e.g. smartphones, wearables, and smart cars, enable the collection and processing of data containing sensitive information about consumers. However, while the collection, processing, and sharing of data realize economic potential, they also impose privacy requirements. Thus, we envision the development of a privacy-enhancing tool based on de-identification methods that meets privacy requirements while simultaneously enabling data processing. In this research-in-progress paper, we present two initial contributions towards this goal: an overview of de-identification methods based on a structured literature review, and a description of the use cases with which the resulting tool will be evaluated.
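
    As a rough illustration of the method families such a tool would bundle, the Python sketch below applies suppression, generalization, and keyed pseudonymization to a toy record; the field names, bin width, and key handling are hypothetical.

        import hashlib, hmac

        SECRET_KEY = b"replace-with-a-managed-secret"  # hypothetical key handling

        def suppress(record, fields=("name", "phone")):
            # Drop direct identifiers entirely.
            return {k: v for k, v in record.items() if k not in fields}

        def generalize_age(record, bin_width=10):
            # Replace the exact age with a coarse range, e.g. 34 -> "30-39".
            lo = (record["age"] // bin_width) * bin_width
            return {**record, "age": f"{lo}-{lo + bin_width - 1}"}

        def pseudonymize(record, field="user_id"):
            # Replace an identifier with a keyed hash: still linkable across
            # datasets, but the original value is not revealed.
            digest = hmac.new(SECRET_KEY, str(record[field]).encode(), hashlib.sha256)
            return {**record, field: digest.hexdigest()[:16]}

        record = {"name": "Ada", "user_id": 42, "age": 34, "phone": "555-0100"}
        print(pseudonymize(generalize_age(suppress(record))))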